Strong Consistency of Kernel Regression Estimate

Authors

Abstract

In this paper, regression function estimation from independent and identically distributed data is considered. We establish strong pointwise consistency of the well-known Nadaraya-Watson estimator under weaker conditions, which permit the use of kernels with unbounded support, and even non-integrable kernels, and provide a general approach for constructing strongly consistent kernel estimates of regression...
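
As a concrete illustration of the Nadaraya-Watson estimator discussed in the abstract, here is a minimal Python sketch with a Gaussian kernel; the kernel choice, bandwidth, and simulated data are illustrative assumptions and not details taken from the paper.

```python
import numpy as np

def nadaraya_watson(x_query, X, Y, h=0.3):
    """Nadaraya-Watson estimate: m_hat(x) = sum_i Y_i K((x - X_i)/h) / sum_i K((x - X_i)/h)."""
    # Gaussian kernel weights (illustrative choice; the paper allows far more general kernels)
    w = np.exp(-0.5 * ((x_query - X) / h) ** 2)
    return np.sum(w * Y) / np.sum(w)

# Simulated i.i.d. data from Y = m(X) + noise, with m(x) = sin(2*pi*x)
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 500)
Y = np.sin(2 * np.pi * X) + 0.2 * rng.standard_normal(500)

print(nadaraya_watson(0.25, X, Y))  # should be close to sin(pi/2) = 1
```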


Similar articles


Strong Universal Consistency of Smooth Kernel Regression Estimates

The paper deals with kernel estimates of Nadaraya-Watson type for a regression function with square integrable response variable. For usual bandwidth sequences and smooth nonnegative kernels, e.g., Gaussian and quartic kernels, strong L2-consistency is shown without any further condition on the underlying distribution. The proof uses a Tauberian theorem for Cesàro summability. Let X be a d-dime...
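
For concreteness, here is a small Python sketch of the two smooth nonnegative kernels mentioned in the abstract (Gaussian and quartic), which can be plugged into a Nadaraya-Watson-type estimate like the one sketched earlier; the normalizing constants follow the standard univariate definitions, and the bandwidth argument is an illustrative assumption.

```python
import numpy as np

def gaussian_kernel(u):
    """Standard Gaussian kernel: smooth, nonnegative, unbounded support."""
    return np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)

def quartic_kernel(u):
    """Quartic (biweight) kernel: smooth, nonnegative, supported on [-1, 1]."""
    return np.where(np.abs(u) <= 1, (15 / 16) * (1 - u ** 2) ** 2, 0.0)

def nw_estimate(x, X, Y, kernel, h):
    """Nadaraya-Watson-type estimate with a user-supplied kernel and bandwidth h."""
    w = kernel((x - X) / h)
    return np.sum(w * Y) / np.sum(w)
```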


Consistency and Robustness of Kernel Based Regression

We investigate properties of kernel based regression (KBR) methods which are inspired by the convex risk minimization method of support vector machines. We first describe the relation between the loss function used in the KBR method and the tail of the response variable Y. We then establish a consistency result for KBR and give assumptions for the existence of the influence function. In partic...
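
As one concrete instance of kernel based regression via regularized convex risk minimization, here is a Python sketch using the squared loss (i.e., kernel ridge regression) with a Gaussian kernel; the loss, kernel, and regularization parameter are illustrative assumptions rather than the specific setting analysed in the paper.

```python
import numpy as np

def gaussian_gram(X, Z, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||X_i - Z_j||^2)."""
    d2 = np.sum(X ** 2, axis=1)[:, None] + np.sum(Z ** 2, axis=1)[None, :] - 2 * X @ Z.T
    return np.exp(-gamma * d2)

def kbr_fit(X, y, lam=1e-2, gamma=1.0):
    """Minimize (1/n) sum_i (y_i - f(X_i))^2 + lam * ||f||_H^2 over the RKHS;
    with the squared loss this has the closed form alpha = (K + n*lam*I)^{-1} y."""
    n = len(y)
    K = gaussian_gram(X, X, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def kbr_predict(X_train, alpha, X_new, gamma=1.0):
    """Evaluate f(x) = sum_i alpha_i k(x, X_i) at the new points."""
    return gaussian_gram(X_new, X_train, gamma) @ alpha
```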


Consistency of kernel-based quantile regression

Quantile regression is used in many areas of applied research and business. Examples include actuarial, financial, and biometrical applications. We show that a non-parametric generalization of quantile regression based on kernels shares with support vector machines the property of consistency to the Bayes risk. We further use this consistency to prove that the non-parametric generalization approximat...
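
To make the setup concrete, here is a minimal Python sketch of kernel-based quantile regression that minimizes a regularized pinball (check) loss by subgradient descent; the precomputed Gram matrix, step size, and regularization constant are illustrative assumptions, and the paper's actual contribution concerns consistency rather than this particular solver.

```python
import numpy as np

def pinball_loss(residual, tau):
    """Check loss: tau * r for r >= 0, (tau - 1) * r otherwise."""
    return np.maximum(tau * residual, (tau - 1) * residual)

def kernel_quantile_fit(K, y, tau=0.5, lam=1e-3, lr=0.01, n_iter=2000):
    """Fit f(x) = sum_j alpha_j k(x, x_j) by subgradient descent on
    (1/n) * sum_i pinball(y_i - f(x_i), tau) + lam * alpha^T K alpha,
    where K is the precomputed kernel Gram matrix of the training points."""
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        residual = y - K @ alpha
        # Subgradient of the pinball loss with respect to f(x_i)
        g = np.where(residual > 0, -tau, 1 - tau)
        grad = K @ g / n + 2 * lam * K @ alpha
        alpha -= lr * grad
    return alpha
```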


Kernel Density Based Linear Regression Estimate

For linear regression models with non-normally distributed errors, the least squares estimate (LSE) will lose some efficiency compared to the maximum likelihood estimate (MLE). In this article, we propose a kernel density based regression estimate (KDRE) that is adaptive to the unknown error distribution. The key idea is to approximate the likelihood function by using a nonparametric kernel den...
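
The key idea can be sketched in a few lines of Python: estimate the error density from the current residuals with a kernel density estimator and choose the regression coefficients that maximize the resulting leave-one-out log-likelihood; the Gaussian kernel, fixed bandwidth, and use of a generic optimizer are illustrative assumptions rather than the exact procedure of the paper.

```python
import numpy as np
from scipy.optimize import minimize

def kdre_fit(X, y, h=0.5):
    """Kernel-density-based regression sketch: pick beta maximizing the
    leave-one-out kernel density log-likelihood of the residuals y - X @ beta."""
    n = len(y)

    def neg_loglik(beta):
        r = y - X @ beta
        diff = (r[:, None] - r[None, :]) / h          # pairwise residual differences
        dens = np.exp(-0.5 * diff ** 2) / np.sqrt(2 * np.pi)
        np.fill_diagonal(dens, 0.0)                   # leave-one-out: drop the i = j term
        f_hat = dens.sum(axis=1) / ((n - 1) * h)      # KDE of the error density at each residual
        return -np.sum(np.log(f_hat + 1e-12))

    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]      # start from the least squares estimate
    return minimize(neg_loglik, beta0, method="Nelder-Mead").x
```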



Journal

Journal title: Open Journal of Statistics

Year: 2013

ISSN: 2161-718X, 2161-7198

DOI: 10.4236/ojs.2013.33020